Translation and Dictionary
Words near each other
・ Vanished (1995 film)
・ Vanished (2009 film)
・ Vanished (Crystal Castles song)
・ Vanished (disambiguation)
・ Vanished Children's Alliance
・ Vanished Planet
・ Vanisher
・ Vanishing
・ Vanishing Act
・ Vanishing Act (The Outer Limits)
・ Vanishing Africa
・ Vanishing bile duct syndrome
・ Vanishing bird cage
・ Vanishing cycle
・ Vanishing dimensions theory
Vanishing gradient problem
・ Vanishing Hand
・ Vanishing hitchhiker
・ Vanishing island
・ Vanishing Lessons
・ Vanishing mediator
・ Vanishing of the Bees
・ Vanishing on 7th Street
・ Vanishing point
・ Vanishing Point (1971 film)
・ Vanishing Point (1997 film)
・ Vanishing Point (2012 film)
・ Vanishing Point (band)
・ Vanishing Point (CBC)
・ Vanishing point (disambiguation)



Vanishing gradient problem : Wikipedia English edition
Vanishing gradient problem

In machine learning, the '''vanishing gradient problem''' is a difficulty found in training artificial neural networks with gradient-based learning methods and backpropagation. In such methods, each of the neural network's weights receives an update proportional to the gradient of the error function with respect to the current weight in each iteration of training. Traditional activation functions such as the hyperbolic tangent function have gradients in the range (−1, 1) or [0, 1), and backpropagation computes gradients by the chain rule. This has the effect of multiplying ''n'' of these small numbers to compute gradients of the "front" layers in an ''n''-layer network, meaning that the gradient (error signal) decreases exponentially with ''n'' and the front layers train very slowly.
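To make the chain-rule multiplication concrete, here is a minimal sketch (not from the article; the depth, width, and weight scale are illustrative assumptions) that backpropagates an error signal through a deep tanh network in NumPy and prints the gradient norm every ten layers:

import numpy as np

rng = np.random.default_rng(0)
n_layers, width = 50, 64

# Modest random weights (illustrative scale): the spectral norm of each
# matrix stays near 1, so the tanh derivatives dominate the product.
weights = [rng.normal(0.0, 0.5 / np.sqrt(width), (width, width))
           for _ in range(n_layers)]

# Forward pass, caching each layer's output so the backward pass can use
# tanh'(z) = 1 - tanh(z)^2.
a = rng.normal(size=width)
activations = [a]
for W in weights:
    a = np.tanh(W @ a)
    activations.append(a)

# Backward pass: repeatedly multiply the error signal by W^T diag(tanh'),
# i.e. the chain-rule product described above.
grad = np.ones(width)
for i in reversed(range(n_layers)):
    grad = weights[i].T @ (grad * (1.0 - activations[i + 1] ** 2))
    if i % 10 == 0:
        print(f"layer {i:2d}: ||grad|| = {np.linalg.norm(grad):.3e}")

Because the tanh derivative is strictly less than 1 except at zero input, each backward step shrinks the error signal, and the printed norms fall by orders of magnitude between the output layer and layer 0.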
With the advent of the backpropagation algorithm in the 1970s, many researchers tried to train supervised deep artificial neural networks from scratch, initially with little success. Sepp Hochreiter's diploma thesis of 1991〔S. Hochreiter. Untersuchungen zu dynamischen neuronalen Netzen. Diploma thesis, Institut f. Informatik, Technische Univ. Munich, 1991. Advisor: J. Schmidhuber〕〔S. Hochreiter, Y. Bengio, P. Frasconi, and J. Schmidhuber. Gradient flow in recurrent nets: the difficulty of learning long-term dependencies. In S. C. Kremer and J. F. Kolen, editors, A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press, 2001.〕 formally identified the reason for this failure as the "vanishing gradient problem", which affects not only many-layered feedforward networks but also recurrent neural networks. The latter are trained by unfolding them into very deep feedforward networks, where a new layer is created for each time step of an input sequence processed by the network.
When activation functions are used whose derivatives can take on larger values, one risks encountering the related ''exploding gradient problem''.
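For contrast, a companion sketch under a different assumption (one standard way the exploding case is demonstrated, not the article's own example): with piecewise-linear ReLU units, whose derivative is exactly 1 wherever a unit is active, the activation no longer damps the product, and weight matrices scaled well past norm 1 make the same backward product grow exponentially.

import numpy as np

rng = np.random.default_rng(0)
n_layers, width = 50, 64

# Deliberately large random weights (illustrative scale): each matrix has
# spectral norm well above 1, so each backward step scales the error up.
weights = [rng.normal(0.0, 2.0 / np.sqrt(width), (width, width))
           for _ in range(n_layers)]

# Forward pass with ReLU, caching pre-activations so the backward pass
# knows which units were active (ReLU' is 1 there and 0 elsewhere).
z_cache = []
a = rng.normal(size=width)
for W in weights:
    z = W @ a
    z_cache.append(z)
    a = np.maximum(z, 0.0)

# Backward pass: the gradient norm now grows by a roughly constant factor
# per layer instead of shrinking.
grad = np.ones(width)
for i in reversed(range(n_layers)):
    grad = weights[i].T @ (grad * (z_cache[i] > 0))
    if i % 10 == 0:
        print(f"layer {i:2d}: ||grad|| = {np.linalg.norm(grad):.3e}")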
== Solutions ==


Text excerpted from: Wikipedia, the free encyclopedia
Read the full article on "Vanishing gradient problem" at Wikipedia



Translation and Dictionary : Internet resources for translation

Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.